
    Developing a Lightweight Rock-Paper-Scissors Framework for Human-Robot Collaborative Gaming

    © 2013 IEEE. We present a novel implementation of a Rock-Paper-Scissors (RPS) game interaction with a social robot. The framework is designed to be computationally lightweight as well as entertaining and visually appealing, the latter through collaboration with designers and animators. The gesture recognition pipeline employs a Leap Motion device and two separate machine learning architectures to evaluate kinematic hand data on the fly: the first recognizes and segments human motion activity in order to initialize the RPS play, and the second classifies hand gestures as rock, paper, or scissors. The tabletop robot takes part in the RPS play through animated gestural movements and vocalizations, designed by animators, which communicate the robot's choices as well as its cognitive reflection on winning, losing, and draw states. The performance of both learning architectures is carefully evaluated with respect to accuracy, reliability, and run-time performance under different feature and classifier types. Moreover, we assess the system during interactive RPS play between robot and human. Experimental results show that the proposed system is robust to user variations and play styles under real-world conditions. As such, it offers a powerful application for the subsequent exploration of social human-machine interaction.
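
    The two-stage recognition described above lends itself to a compact illustration: one classifier flags the motion activity that starts a round, and a second classifies the final pose as rock, paper, or scissors. The Python sketch below assumes a hypothetical window size, hand-crafted features, and scikit-learn classifiers; it is not the authors' implementation.

        # Illustrative two-stage pipeline: activity segmentation, then gesture classification.
        # Window length, features, and classifier choices are assumptions for this sketch.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC

        WINDOW = 30  # hypothetical number of Leap Motion frames per decision window

        def window_features(frames):
            """frames: (WINDOW, n_channels) array of kinematic hand data, e.g. fingertip
            positions and velocities. Returns a fixed-length feature vector."""
            return np.concatenate([frames.mean(axis=0), frames.std(axis=0),
                                   np.abs(np.diff(frames, axis=0)).mean(axis=0)])

        # Stage 1: detect the shaking motion that initializes a round of play.
        activity_clf = RandomForestClassifier(n_estimators=100)
        # Stage 2: classify the final hand pose.
        gesture_clf = SVC(kernel="rbf")

        def train(motion_windows, motion_labels, pose_windows, pose_labels):
            activity_clf.fit([window_features(w) for w in motion_windows], motion_labels)
            gesture_clf.fit([window_features(w) for w in pose_windows], pose_labels)

        def play_round(stream):
            """stream: iterable of (WINDOW, n_channels) windows captured on the fly."""
            started = False
            for frames in stream:
                f = window_features(frames).reshape(1, -1)
                if not started:
                    started = activity_clf.predict(f)[0] == "shake"  # wait for the count-in motion
                else:
                    return gesture_clf.predict(f)[0]  # "rock" | "paper" | "scissors"
            return None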

    Optimization of Planck/LFI on-board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on the ground by a reversing step. This paper illustrates the LFI scientific onboard processing used to fit the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided by simulations, prelaunch tests, or data taken from LFI operating in diagnostic mode. All the required optimization steps are performed by an automated tool, OCA2, which produces optimized parameters together with a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters; this model will also be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white noise rms, well within the requirements. Comment: 51 pages, 13 figures, 3 tables; submitted to JINST 23 Jun 2009, accepted 10 Nov 2009, published 29 Dec 2009. This is a preprint, not the final version.
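
    The trade-off that OCA2 optimizes can be illustrated with a toy example: averaging and requantizing a simulated white-noise stream, then measuring the resulting distortion relative to the white-noise rms and an entropy-based bound on the achievable compression. The simple average-and-requantize model, the 16-bit raw-sample assumption, and the parameter values below are assumptions for this sketch, not the actual LFI onboard pipeline or the OCA2 tool.

        # Toy illustration of the Cr / EpsilonQ trade-off tuned by the onboard parameters.
        import numpy as np

        def process(samples, naver, q, offset):
            """Average groups of `naver` samples, then requantize with step q and offset."""
            n = len(samples) // naver * naver
            averaged = samples[:n].reshape(-1, naver).mean(axis=1)
            quantized = np.round(averaged / q + offset).astype(np.int64)  # integers sent to ground
            reconstructed = (quantized - offset) * q                       # on-ground reversing step
            return quantized, averaged, reconstructed

        def epsilon_q(averaged, reconstructed, white_noise_rms):
            """Processing distortion as a fraction of the instrumental white-noise rms."""
            return np.sqrt(np.mean((reconstructed - averaged) ** 2)) / white_noise_rms

        def entropy_bits(quantized):
            """Empirical entropy of the quantized stream: a bound on lossless compression."""
            _, counts = np.unique(quantized, return_counts=True)
            p = counts / counts.sum()
            return -(p * np.log2(p)).sum()

        rng = np.random.default_rng(0)
        sigma = 1.0                              # white-noise rms (arbitrary units)
        raw = rng.normal(0.0, sigma, 100_000)    # stand-in for raw detector data
        quant, avg, rec = process(raw, naver=1, q=0.3 * sigma, offset=0)
        print("EpsilonQ  ~", epsilon_q(avg, rec, sigma))    # stays below the 10% requirement here
        print("Cr (bound)~", 16 / entropy_bits(quant))      # vs. an assumed 16-bit raw sample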

    Herschel FIR counterparts of selected Ly-alpha emitters at z~2.2. Fast evolution since z~3 or missed obscured AGNs?

    Ly-alpha emitters (LAEs) are found throughout the redshift range from the local Universe to z~7. Far-infrared (FIR) counterparts of LAEs at different epochs could provide direct clues on the dust content, extinction, and spectral energy distribution (SED) of these galaxies. We search for FIR counterparts of optically detected LAEs in the GOODS-North field at redshift z~2.2 using data from the Herschel Space Observatory taken with the Photodetector Array Camera and Spectrometer (PACS). The LAE candidates were isolated via a color-magnitude diagram using medium-band photometry from the ALHAMBRA Survey, ancillary data on GOODS-North, and stellar population models. According to the fits of these spectral synthesis models and to FIR/optical diagnostics, most of the candidates appear to be obscured galaxies whose spectra are AGN-dominated. From the analysis of the optical data, we measure a fraction of AGN or composite sources of ~0.75 of the total LAE population at z~2.2. This is marginally consistent with the fraction previously observed at z=2.25 and even at low redshift (0.2<z<0.45), but significantly different from the fraction observed at redshift ~3, which could be compatible either with a scenario of rapid change in the AGN fraction between these epochs or with the non-detection of obscured AGN in other z=2-3 LAE samples due to the lack of deep FIR observations. We find three robust FIR (PACS) counterparts at z~2.2 in GOODS-North, demonstrating the possibility of finding dust emission in LAEs even at higher redshifts. Comment: 11 pages (including appendices), 6 figures. Accepted for publication in Astronomy & Astrophysics Letters (two references added).
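
    The candidate selection mentioned above rests on a medium-band excess: the band covering Ly-alpha at z~2.2 is compared against the adjacent continuum, and sources with a sufficiently strong, sufficiently bright excess are kept. The Python sketch below uses placeholder thresholds and generic band inputs; it is not the ALHAMBRA color-magnitude criterion actually applied in the paper.

        # Schematic medium-band excess selection for LAE candidates; thresholds and
        # band choices are placeholders, not the selection used in the paper.
        import numpy as np

        def ab_mag(flux_uJy):
            """AB magnitude from a flux density in microjanskys."""
            return -2.5 * np.log10(flux_uJy * 1e-6) + 8.90

        def select_lae_candidates(f_medium, f_continuum, excess_threshold=0.5, mag_limit=25.0):
            """Keep sources whose medium band (covering Ly-alpha at z~2.2) is brighter than
            the adjacent continuum band by more than `excess_threshold` magnitudes and
            brighter than `mag_limit`. Inputs are flux-density arrays in microjanskys."""
            m_med, m_cont = ab_mag(f_medium), ab_mag(f_continuum)
            color_excess = m_cont - m_med   # positive when the line band is in excess
            return (color_excess > excess_threshold) & (m_med < mag_limit)

        # Example: three sources, only the first shows a clear medium-band excess.
        f_med = np.array([2.0, 0.8, 1.1])
        f_cont = np.array([0.9, 0.8, 1.0])
        print(select_lae_candidates(f_med, f_cont))   # -> [ True False False ]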

    Data mining for software engineering and humans in the loop

    The field of data mining for software engineering has been growing over the last decade. This field is concerned with using data mining to provide useful insights into how to improve software engineering processes and software itself, supporting decision-making. For that, data produced by software engineering processes and products during and after software development are used. Despite promising results, there is frequently a lack of discussion of the role of software engineering practitioners in these data mining approaches, which makes adoption by practitioners difficult. Moreover, the fact that experts' knowledge is frequently ignored by data mining approaches, together with the lack of transparency of such approaches, can hinder their acceptability to software engineering practitioners. To overcome these problems, this position paper discusses the role of software engineering experts when adopting data mining approaches. It also argues that this role can be extended to increase experts' involvement in the process of building data mining models. We believe that such extended involvement is not only likely to increase software engineers' acceptance of the resulting models, but also to improve the models themselves. We also provide some recommendations aimed at increasing the success of experts' involvement and model acceptability.

    The European Solar Telescope

    The European Solar Telescope (EST) is a project aimed at studying the magnetic connectivity of the solar atmosphere, from the deep photosphere to the upper chromosphere. Its design combines the knowledge and expertise gathered by the European solar physics community during the construction and operation of state-of-the-art solar telescopes operating at visible and near-infrared wavelengths: the Swedish 1-m Solar Telescope, the German Vacuum Tower Telescope and GREGOR, the French Télescope Héliographique pour l'Étude du Magnétisme et des Instabilités Solaires, and the Dutch Open Telescope. With its 4.2 m primary mirror and an open configuration, EST will become the most powerful European ground-based facility to study the Sun in the coming decades in the visible and near-infrared bands. EST uses the most innovative technological advances: the first adaptive secondary mirror ever used in a solar telescope, a complex multi-conjugate adaptive optics system with deformable mirrors that form part of the optical design in a natural way, a polarimetrically compensated telescope design that eliminates the complex temporal variation and wavelength dependence of the telescope Mueller matrix, and an instrument suite containing several (etalon-based) tunable imaging spectropolarimeters and several integral field unit spectropolarimeters. This publication summarises some fundamental science questions that can be addressed with the telescope, together with a complete description of its major subsystems.
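
    The benefit of the polarimetrically compensated design can be stated compactly in Mueller calculus (a schematic formulation, not taken from the EST design documents): the measured Stokes vector is the true vector multiplied by the telescope Mueller matrix, and the compensated layout keeps that matrix close to a scalar multiple of the identity, so that it no longer varies appreciably with time or wavelength.

        % Schematic Mueller-calculus statement of the compensation goal
        % (an illustration, not the EST design equations).
        S_{\mathrm{meas}}(t,\lambda) \;=\; M_{\mathrm{tel}}(t,\lambda)\, S_{\mathrm{true}},
        \qquad S = (I, Q, U, V)^{\mathsf{T}},
        \qquad M_{\mathrm{tel}}(t,\lambda) \;\approx\; c\,\mathbb{1}_{4\times 4}.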

    HARMONI at ELT: overview of the capabilities and expected performance of the ELT's first light, adaptive optics assisted integral field spectrograph.

    The Planck-LFI radiometer electronics box assembly

    The Radiometer Electronics Box Assembly (REBA) is the control and data-processing on-board computer of the Low Frequency Instrument (LFI) of the ESA Planck mission. The REBA was designed and built around state-of-the-art processors, communication interfaces, and real-time operating system software in order to meet the scientific performance requirements of the LFI. We present a technical summary of the REBA, including a physical, functional, electrical, mechanical, and thermal description, and cover aspects of the design, development, assembly, integration, and verification of the equipment. A brief description of the LFI on-board software is given, including the Low-Level Software and the main functionalities and architecture of the Application Software. The compressor module, which was developed as an independent product and later integrated into the application, is also described in this paper. Two identical engineering models (EM and AVM), the engineering qualification model (EQM), the flight model (FM), and the flight spare have been manufactured and tested. The Low-Level and Application Software have been developed. Verification activities demonstrated that the REBA hardware and software fulfil all the specifications and perform as required for flight operation.

    Evaluating the governance model of hardware-dependent software ecosystems - a case study of the Axis ecosystem

    Ecosystem governance gradually becomes more relevant for a set of companies or actors characterized by symbiotic relations that evolve on top of a technological platform, i.e. a software ecosystem. In this study, we focus on the governance of a hardware-dependent software ecosystem. More specifically, we evaluate the governance model applied by Axis, a network video and surveillance camera producer that is the platform owner and orchestrator of the Application Development Partner (ADP) software ecosystem. We conduct an exploratory case study, collecting data from observations and interviews, and apply the governance model for prevention and improvement of software ecosystem health proposed by Jansen and Cusumano. Our results reveal that, although the governance actions do not address the majority of their governance model, the ADP ecosystem is considered a growing ecosystem providing opportunities for its actors. This can be explained by the fact that Axis, as the orchestrator and platform owner, does not adequately address the productivity and robustness of the ecosystem, but has a network of vendors and resellers to support it, and some of the governance activities (e.g. communication) are achieved by informal means. The current governance model does not take into consideration